
    Understanding How Personality Affects the Acceptance of Technology: A Literature Review

    The aim of this literature review is to summarize the current state of research on the influence of the extended Big Five personality traits on the acceptance of technology and to uncover inconsistencies and gaps in knowledge. It focuses on the question of how the characteristics openness to experience, extraversion, agreeableness, conscientiousness, neuroticism and willingness to take risks affect people's acceptance of new technologies. Within the framework of the literature review, a total of 378 topic-relevant results were analyzed, and ultimately a sample of 22 studies was selected to reflect the current state of research. Upon review, most of these studies provide significant results for each of the six personality traits. Furthermore, it was found that most researchers use the Technology Acceptance Model (TAM) to measure technology acceptance and that the samples consisted mainly of students. In view of the increasing use of intelligent technologies in almost all areas of life, it is particularly important to continuously investigate the factors influencing technology acceptance, and to do so in a way that is representative of all social groups.

    The Iray Light Transport Simulation and Rendering System

    While ray tracing has become increasingly common and path tracing is well understood by now, a major challenge lies in crafting an easy-to-use and efficient system implementing these technologies. Following a purely physically-based paradigm while still allowing for artistic workflows, the Iray light transport simulation and rendering system allows for rendering complex scenes at the push of a button and thus makes accurate light transport simulation widely available. In this document we discuss the challenges and implementation choices that follow from our primary design decisions, demonstrating that such a rendering system can be made a practical, scalable, and efficient real-world application that has been adopted by various companies across many fields and is in use by many industry professionals today.

    Smart Cities as Focal Entities for Strategic Communication - Considering the Public's Concerns Regarding the Use of Information and Communication Technology

    This paper addresses people's knowledge of, acceptance of and attitude towards the concept of the Smart City. To this end, inhabitants of Leipzig (Germany) and Tallinn (Estonia) were surveyed online and asked to evaluate 10 technologies that can be used in a Smart City and to rate the Smart City concept itself. First, results show significant differences in the level of knowledge and acceptance of smart technologies between citizens of Leipzig and Tallinn. In addition, the data provides information on the extent to which citizens are willing to live in a Smart City and how they perceive its advantages. Second, the data provides information about the perceived opportunities and risks of the Smart City and thus indicates which aspects should be addressed in future strategic communication in order to increase public trust and acceptance.

    Evaluating the long short-term memory (LSTM) network for discharge prediction under changing climate conditions

    Better understanding the predictive capabilities of hydrological models under contrasting climate conditions will enable more robust decision-making. Here, we tested the ability of the long short-term memory (LSTM) network to predict daily discharge under changing conditions using six snow-influenced catchments in Switzerland. We benchmarked the LSTM against the Hydrologiska Byråns Vattenbalansavdelning (HBV) bucket-type model with two parameterizations. We compared the model performance under changing conditions against constant conditions and tested the impact of the time-series size used in calibration on the model performance. When calibrated, the LSTM resulted in a much better fit than the HBV. However, in validation, the performance of the LSTM dropped considerably, and the fit was as good as or poorer than the HBV performance in validation. Using longer time series in calibration improved the robustness of the LSTM, whereas the HBV needed less data to ensure a robust parameterization. When using the maximum number of years in calibration, the LSTM was considered robust enough to simulate discharges in a drier period than the one used in calibration. Overall, the HBV was found to be less sensitive than the data-driven model for applications under contrasting climates. However, other LSTM modeling setups might be able to improve the transferability between different conditions.
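    The HBV benchmark mentioned in this abstract is a conceptual bucket-type model. As a rough illustration of what "bucket-type" means here, the following is a heavily simplified, hypothetical sketch of one daily time step (a degree-day snow routine, a soil-moisture bucket and a linear reservoir); the parameter names and structure are a minimal caricature, not the actual HBV parameterizations used in the study:

```python
def hbv_step(state, precip, temp, pet, params):
    """One daily step of a heavily simplified HBV-style bucket model.

    state:  dict with "snow" (snowpack), "soil" (soil moisture) and
            "stor" (groundwater storage), all in mm
    params: tt    - snow/rain threshold temperature [deg C]
            cfmax - degree-day melt factor [mm / deg C / day]
            fc    - soil field capacity [mm]
            beta  - runoff-generation shape parameter [-]
            k     - linear-reservoir recession coefficient [1/day]
    Returns (updated state, simulated discharge in mm/day).
    """
    snow, soil, stor = state["snow"], state["soil"], state["stor"]

    # Partition precipitation into snow and rain by a temperature threshold.
    if temp <= params["tt"]:
        snow += precip
        rain = 0.0
    else:
        rain = precip

    # Degree-day snowmelt, limited by the available snowpack.
    melt = min(snow, max(0.0, params["cfmax"] * (temp - params["tt"])))
    snow -= melt

    # Soil bucket: the wetter the soil, the larger the recharge fraction.
    inflow = rain + melt
    recharge = inflow * min(1.0, soil / params["fc"]) ** params["beta"]
    soil += inflow - recharge

    # Actual evapotranspiration, limited by soil moisture.
    aet = min(soil, pet * min(1.0, soil / params["fc"]))
    soil -= aet

    # Linear reservoir converts recharge to discharge.
    stor += recharge
    q = params["k"] * stor
    stor -= q

    return {"snow": snow, "soil": soil, "stor": stor}, q
```

    Calibration in such a model means fitting the handful of parameters (here tt, cfmax, fc, beta, k) to observed discharge, which is why it needs far less data than the thousands of weights of an LSTM.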

    Mythos Praxis um jeden Preis? Die Wurzeln und Modellierung des Lehr-Lern-Labors

    Lehr-Lern-Labor seminars are university seminar formats, long established in science education and more recently also found in humanities education, that incorporate repeated visits by school students to the university. At first glance, they increase the practical component of teacher-education programmes, which seems to meet many students' desire for more practice and to counteract the seemingly overly theoretical character of teacher education. On closer analysis, however, what is desirable for a modern, adequate first phase of teacher education is not "more" practice but "better" practice, i.e. practice that is reflected upon and linked to theory. Lehr-Lern-Labor seminars can offer such "better" practice, but they must fulfil a number of criteria. These include not only a long-overdue definition and theoretical modelling of the teaching format, but also the explicit embedding of theoretical and reflective phases. The former is developed and presented in this contribution; the latter requires empirical examination, which is also being carried out in the Lehr-Lern-Labor seminar project at Freie Universität Berlin.

    Reconstructing microstructures from statistical descriptors using neural cellular automata

    The problem of generating microstructures of complex materials in silico has been approached from various directions, including simulation-based, Markov, deep learning, and descriptor-based approaches. This work presents a hybrid method that is inspired by all four categories and has interesting scalability properties. A neural cellular automaton is trained to evolve microstructures based on local information. Unlike most machine learning-based approaches, it does not directly require a data set of reference micrographs, but is trained from statistical microstructure descriptors that can stem from a single reference. This means that the training cost scales only with the complexity of the structure and associated descriptors. Since the size of the reconstructed structures can be set during inference, even extremely large structures can be efficiently generated. Similarly, the method is very efficient if many structures are to be reconstructed from the same descriptor for statistical evaluations. The method is formulated and discussed in detail by means of various numerical experiments, demonstrating its utility and scalability.
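    To make the "evolve microstructures based on local information" idea concrete, here is a toy, untrained NumPy sketch of a single neural-cellular-automaton update. The perception stage, layer sizes and stochastic fire rate are illustrative assumptions; the paper's learned perception filters, descriptor-based loss and training loop are not reproduced:

```python
import numpy as np

def nca_step(state, w1, b1, w2, rng, fire_rate=0.5):
    """One update of a minimal neural cellular automaton on a 2-D grid.

    state: (H, W, C) cell states; channel 0 could hold the visible phase.
    Each cell perceives its own state plus a Laplacian of its neighbours
    (a crude stand-in for learned perception filters), then applies a tiny
    per-cell MLP (w1, b1, w2) shared across the whole grid.
    """
    # Neighbour information via periodic shifts: a discrete Laplacian.
    lap = (np.roll(state, 1, axis=0) + np.roll(state, -1, axis=0)
           + np.roll(state, 1, axis=1) + np.roll(state, -1, axis=1)
           - 4.0 * state)
    percept = np.concatenate([state, lap], axis=-1)   # (H, W, 2C)

    # Shared per-cell MLP: the "neural" part of the automaton.
    hidden = np.maximum(percept @ w1 + b1, 0.0)       # ReLU
    delta = hidden @ w2                               # (H, W, C)

    # Stochastic update: only a random fraction of cells fires each step.
    mask = rng.random(state.shape[:2] + (1,)) < fire_rate
    return state + delta * mask
```

    Because the same small MLP is applied at every cell, the grid size is a free choice at inference time, which is exactly why arbitrarily large structures can be rolled out from a rule trained on a small reference.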

    Selection and optimization of a suitable pretreatment method for miscanthus and poplar raw material

    Miscanthus and poplar are very promising second-generation feedstocks due to their high growth rates and low nutrient demand. The aim of the study was to develop a systematic approach for choosing suitable pretreatment methods, evaluated with the modified severity factor (log R₀'). Optimal pretreatment results in a high delignification grade, low cellulose solubilization and increased accessibility for enzymatic hydrolysis while revealing minimal log R₀' values. To this end, several reaction approaches were compared. Acid-catalyzed organosolv processing carried out for miscanthus and poplar revealed the highest delignification grade, leading to a relatively high glucose yield after enzymatic saccharification. In both cases, a design of experiments approach was used to study the influence of relevant parameters. Modeling the data resulted in the identification of optimum pretreatment conditions for miscanthus with concentrations of 0.16% H₂SO₄ and 50% EtOH at 185°C for a retention time of 60 min. Experimental validation of these conditions revealed an even higher delignification degree (88%) and glucose yield (85%) than predicted. For poplar, 0.19% H₂SO₄ and 50% EtOH were determined as the optimum concentrations, and 182°C and 48 min as the optimum pretreatment conditions; the delignification degree was 84% and the resulting glucose yield 70%.
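    For readers unfamiliar with the severity factor, a common (Overend-Chornet) form is R₀ = t · exp((T − T_ref)/ω) with T_ref = 100 °C and ω = 14.75, and one widely used acid-modified ("combined") variant subtracts the pH: log R₀' = log R₀ − pH. The exact definition used in the study may differ; the sketch below assumes these conventional forms:

```python
import math

def log_severity(t_min, temp_c, ph=None, t_ref=100.0, omega=14.75):
    """log10 of the pretreatment severity factor R0.

    R0 = t * exp((T - T_ref) / omega)   (Overend-Chornet form)
    t_min  : retention time in minutes
    temp_c : pretreatment temperature in deg C
    ph     : if given, returns the acid-modified ("combined") severity
             log(R0') = log(R0) - pH, one common definition.
    """
    log_r0 = math.log10(t_min * math.exp((temp_c - t_ref) / omega))
    return log_r0 if ph is None else log_r0 - ph
```

    For the miscanthus optimum quoted above (185 °C, 60 min), this conventional form gives log R₀ ≈ 4.28 before any pH correction, which illustrates the scale on which such pretreatments are compared.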

    Measurement of scintillation efficiency for nuclear recoils in liquid argon

    The scintillation light yield of liquid argon from nuclear recoils relative to electronic recoils has been measured as a function of recoil energy from 10 keVr up to 250 keVr at zero electric field. The scintillation efficiency, defined as the ratio of the nuclear recoil scintillation response to the electronic recoil response, is 0.25 ± 0.01 + 0.01 (correlated) above 20 keVr. © 2012 American Physical Society.

    GOVERNANÇA CORPORATIVA: PLANEJAMENTO ESTRATÉGICO E OS CONFLITOS DE AGÊNCIA NA EMPRESA FAMILIAR

    Corporate governance concerns the way entities are directed and monitored. Good governance practices align interests and add value and quality to management and to the organization. This article analyses the potential agency conflicts in an expanding family business and presents possibilities for resolving them through a strategic-planning macrostructure. With respect to its objectives, the research is applied, exploratory and descriptive; its approach is qualitative; and its technical procedure is a case study. As results, the agents and their relationships and the likely agency and alignment conflicts were identified, and a strategic-planning macrostructure was proposed whose central objective is the fulfilment of the family business's mission, considering succession, management and strategic planning, and emphasizing that its use results in the prevention, reduction and resolution of the agency conflicts identified. The study contributes empirically through the possibility of applying the strategic-planning macrostructure to other family businesses of the same size, and to future studies by interrelating agency conflicts with organizational strategic planning.